Text File | 1992-09-20 | 4KB | 134 lines
** File: TPSht_10.hlp
** Index: 49
** More Sheet View Help
:: Fitting Methods
:: Parameter Statistics
█ Powell's method
This method is a hybrid of the
Gauss-Newton and steepest descent methods.
The algorithm used here is based on the
FORTRAN routine proposed by Powell [1].
During each iteration, both Gauss-Newton
and steepest descent corrections to the
current parameter estimates are calculated.
The actual correction is a linear
combination of these two. The derivatives
in the Gauss-Newton method are replaced by
differences (numerical differentiation).
For a detailed description of this method,
the reader is referred to Powell [1].
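The hybrid correction described above can be sketched as follows. This is a simplified illustration only (with an assumed fixed mixing weight `alpha` and a forward-difference step `h`), not Powell's actual FORTRAN routine, which chooses the combination adaptively:

```python
import numpy as np

def hybrid_step(residual, x, h=1e-6, alpha=0.5):
    """One illustrative hybrid correction: a linear combination of
    the Gauss-Newton and steepest-descent steps, with the Jacobian
    approximated by forward differences (numerical differentiation,
    as the text describes)."""
    r = residual(x)
    n = len(x)
    J = np.empty((len(r), n))
    for j in range(n):                   # numerical differentiation
        xp = x.copy()
        xp[j] += h
        J[:, j] = (residual(xp) - r) / h
    g = J.T @ r                          # gradient of 0.5*||r||^2
    sd = -g                              # steepest-descent direction
    gn = np.linalg.solve(J.T @ J, -g)    # Gauss-Newton correction
    return x + alpha * gn + (1 - alpha) * sd
```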
█ Marquardt-Levenberg method
     This method was developed by Marquardt,
based on an earlier suggestion by
Levenberg, for varying smoothly between
the extremes of the inverse Hessian method
and the steepest descent method. The
latter method is used far from the minimum,
switching continuously to the former as
the minimum is approached. This method
works very well in practice and has become
the standard for nonlinear least-squares
routines. This routine is implemented
based on the numerical algorithm described
in Press et al. [3].
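A bare-bones sketch of the Marquardt-Levenberg update follows the general scheme in Press et al.: solve a damped normal equation, grow the damping (toward steepest descent) when a step fails, and shrink it (toward the inverse-Hessian / Gauss-Newton step) when it succeeds. The function below and its damping factors are illustrative assumptions, not TechPlot's implementation:

```python
import numpy as np

def lm_fit(residual, jac, x, lam=1e-3, iters=50):
    """Illustrative Marquardt-Levenberg loop:
    solve (J^T J + lam*diag(J^T J)) dx = -J^T r."""
    r = residual(x)
    cost = r @ r
    for _ in range(iters):
        J = jac(x)
        A = J.T @ J
        dx = np.linalg.solve(A + lam * np.diag(np.diag(A)),
                             -J.T @ r)
        r_new = residual(x + dx)
        if r_new @ r_new < cost:     # accept: behave like Gauss-Newton
            x = x + dx
            r, cost = r_new, r_new @ r_new
            lam *= 0.1
        else:                        # reject: lean toward steepest descent
            lam *= 10.0
    return x
```

On a linear model the damped step is almost the exact Gauss-Newton step, so the loop converges in a few iterations.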
█ Simplex method
     The simplex method is a multi-directional
search method due to Nelder and Mead [8].
When the initial parameter estimates are
so far from the minimum that all other
methods fail, this method may still
locate the region of the minimum.
Geometrically, a simplex in n-dimensional
space is a figure with n+1 vertices. The
algorithm constructs a simplex in the
parameter space, and associates each vertex
with the corresponding sum of squared
deviations. At each step the algorithm
generates a new trial point, compares its
sum of squared deviations with those at
the n+1 vertices, discards the worst
(largest) vertex, and uses the new point
to form another simplex. The
simplex requires only function evaluations,
not derivatives. It is not very efficient
in terms of the number of function
evaluations that it requires. However,
it may frequently be the best method when
you want to get something working quickly.
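As an illustration of this derivative-free approach, the sketch below minimizes a sum of squared deviations with SciPy's Nelder-Mead implementation; the model `y = a*exp(b*t)` and the data are made up for the example and are not from TechPlot:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic data generated from y = 2*exp(0.5*t) (assumed example).
t = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * np.exp(0.5 * t)

def ssd(params):
    """Sum of squared deviations for the trial parameters."""
    a, b = params
    return np.sum((a * np.exp(b * t) - y) ** 2)

# Only function evaluations are needed -- no derivatives.
res = minimize(ssd, x0=[1.0, 1.0], method='Nelder-Mead')
```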
█ Best-fit Parameter Statistics
TechPlot provides best-fit parameter
statistics for linear and nonlinear curve
fitting. The following are the
descriptions of these options.
█ Covariance matrix
     The formula for the covariance matrix
CVM in terms of the Jacobian matrix is
given on page 118 of the TechPlot user's
handbook.
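The handbook formula is not reproduced here; the sketch below uses the standard textbook estimate CVM = s^2 (J^T J)^-1 evaluated at the best fit, which may differ in detail from TechPlot's formula. The square roots of the diagonal give the parameter standard deviations mentioned under fitted-parameter statistics:

```python
import numpy as np

def covariance_matrix(J, r):
    """Standard covariance estimate from the Jacobian J and the
    residual vector r at the best fit (assumed textbook form,
    not necessarily TechPlot's): CVM = s^2 * (J^T J)^-1."""
    n, p = J.shape
    s2 = (r @ r) / (n - p)            # mean square deviation
    cvm = s2 * np.linalg.inv(J.T @ J)
    std = np.sqrt(np.diag(cvm))       # parameter standard deviations
    return cvm, std
```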
█ Goodness-of-fit statistics
     This part summarizes statistics that
compare the calculated best-fit curve
with the observed data.
█ Coefficient of determination (COD)
The coefficient of determination (COD)
is a measure of the fraction of the total
variance accounted for by the model.
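In its usual unweighted form (TechPlot's weighting, if any, is not shown here), the COD can be computed as:

```python
import numpy as np

def cod(y_obs, y_fit):
    """Coefficient of determination: the fraction of the total
    variance about the mean accounted for by the model.
    1.0 means a perfect fit; 0.0 means no better than the mean."""
    ss_res = np.sum((y_obs - y_fit) ** 2)
    ss_tot = np.sum((y_obs - np.mean(y_obs)) ** 2)
    return 1.0 - ss_res / ss_tot
```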
█ Correlation
     The correlation between two variables
indicates how strongly changes in one
variable are related to changes in the
other.
This number is most appropriately
applied to linear regression as an
indication of how closely the two
variables approximate a linear
relationship to each other.
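The usual (Pearson) correlation coefficient can be computed as below; it is +1 or -1 for an exact linear relationship and near 0 when the two variables are unrelated:

```python
import numpy as np

def correlation(x, y):
    """Pearson correlation coefficient between two variables."""
    xc = x - np.mean(x)
    yc = y - np.mean(y)
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))
```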
█ Model selection criterion (MSC)
     The MSC is useful and more justifiable
for comparing the final least-squares fits
that two competing models produce for the
same observed data set.
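One common definition of the MSC is an AIC-like statistic in which the model with the larger MSC is preferred; the formula below is an assumed definition for illustration, and TechPlot's exact formula may differ:

```python
import numpy as np

def msc(y_obs, y_fit, n_params):
    """One common (assumed) form of the model selection criterion:
    ln(SS_total / SS_residual) - 2p/n, penalizing extra parameters.
    Larger MSC indicates the preferred model."""
    n = len(y_obs)
    ss_tot = np.sum((y_obs - np.mean(y_obs)) ** 2)
    ss_res = np.sum((y_obs - y_fit) ** 2)
    return np.log(ss_tot / ss_res) - 2.0 * n_params / n
```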
█ Fitted-parameter statistics
     This part contains information
related to the statistics of the
best-fit parameters.
Standard deviation
Confidence regions
█ Fitted-data statistics
Sum of squared deviation
     Degrees of freedom
Mean square deviation
Confidence interval
Prediction interval
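The first three fitted-data statistics, plus a rough confidence band, can be sketched as follows. The fixed 1.96 critical value is a large-sample assumption standing in for the proper Student-t value, so the band is illustrative only:

```python
import numpy as np

def fitted_data_stats(y_obs, y_fit, n_params):
    """Sketch of the listed fitted-data statistics (assumed
    unweighted forms, not necessarily TechPlot's exact ones)."""
    resid = y_obs - y_fit
    ssd = np.sum(resid ** 2)             # sum of squared deviations
    dof = len(y_obs) - n_params          # degrees of freedom
    msd = ssd / dof                      # mean square deviation
    half_width = 1.96 * np.sqrt(msd)     # rough 95% band half-width
    return ssd, dof, msd, half_width
```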
     See Chapter 2, "Tutorial", Section 8,
for an example of nonlinear curve fitting
and related parameter statistics.